Discriminant Analysis by Locally Linear Transformations
Authors
Abstract
We present a novel discriminant analysis learning method applicable to non-linear data structures. The method can handle pattern classification problems in which each class has a multi-modal distribution and samples of other classes may lie closer to a class than samples of that class itself. Conventional linear discriminant analysis (LDA) and the LDA mixture model cannot solve this linearly non-separable problem. Several local linear transformations are considered, yielding locally transformed classes that maximize the between-class covariance and minimize the within-class covariance. The method involves a novel gradient-based algorithm for finding the optimal set of local linear bases. It does not suffer from a local-maxima problem and converges stably to the global maximum. The method is computationally efficient compared to previous non-linear discriminant analysis based on the kernel approach, and it avoids overfitting by virtue of the linear base structure of the solution. Classification results are given for both simulated data and real face data.
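For context, the conventional LDA baseline that the paper improves upon can be sketched as follows. This is a minimal illustration of classical Fisher LDA (not the authors' locally linear method); the synthetic two-class data and all variable names are hypothetical:

```python
import numpy as np

def lda_directions(X, y):
    """Classical Fisher LDA: find projections maximizing the
    between-class scatter relative to the within-class scatter."""
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    # Solve the generalized eigenproblem Sb w = lambda Sw w
    eigvals, eigvecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(eigvals.real)[::-1]
    return eigvecs[:, order].real  # columns = discriminant directions

# Two well-separated Gaussian classes (hypothetical data)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W = lda_directions(X, y)
```

A single global transformation like this fails when classes are multi-modal or interleaved, which is the situation the paper's set of local linear transformations is designed to address.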
Related References
Facial expression recognition using local binary patterns and discriminant kernel locally linear embedding
Given the nonlinear manifold structure of facial images, a new kernel-based supervised manifold learning algorithm based on locally linear embedding (LLE), called discriminant kernel locally linear embedding (DKLLE), is proposed for facial expression recognition. The proposed DKLLE aims to nonlinearly extract the discriminant information by maximizing the interclass scatter while minimizing the...
Addendum to: "Infinite-dimensional versions of the primary, cyclic and Jordan decompositions", by M. Radjabalipour
In his paper mentioned in the title, which appears in the same issue of this journal, Mehdi Radjabalipour derives the cyclic decomposition of an algebraic linear transformation. A more general structure theory for linear transformations appears in Irving Kaplansky's lovely 1954 book on infinite abelian groups. We present a translation of Kaplansky's results for abelian groups into the terminolo...
Locally Linear Embedded Eigenspace Analysis
The existing nonlinear local methods for dimensionality reduction yield impressive results in data embedding and manifold visualization. However, they also open up the problem of how to define a unified projection from new data to the embedded subspace constructed by the training samples. Thinking globally and fitting locally, we present a new linear embedding approach, called Locally Embedded ...
Discriminant subspace learning constrained by locally statistical uncorrelation for face recognition
High-dimensionality of data and the small sample size problem are two significant limitations for applying subspace methods which are favored by face recognition. In this paper, a new linear dimension reduction method called locally uncorrelated discriminant projections (LUDP) is proposed, which addresses the two problems from a new aspect. More specifically, we propose a locally uncorrelated c...
Improved robustness of automatic speech recognition using a new class definition in linear discriminant analysis
This work discusses the improvements which can be expected when applying linear feature-space transformations based on Linear Discriminant Analysis (LDA) within automatic speech recognition (ASR). It is shown that different factors influence the effectiveness of LDA transformations. Most importantly, increasing the number of LDA classes by using time-aligned states of Hidden Markov Models instea...